Tensor Methods for Large, Sparse Unconstrained Optimization

Author

  • Ali Bouaricha
Abstract

Tensor methods for unconstrained optimization were first introduced by Schnabel and Chow [SIAM J. Optimization, 1 (1991), pp. 293-315], who describe these methods for small to moderate-size problems. The major contribution of this paper is the extension of these methods to large, sparse unconstrained optimization problems. This extension requires an entirely new way of solving the tensor model that makes the methods suitable for solving large, sparse optimization problems efficiently. We present test results for sets of problems where the Hessian at the minimizer is nonsingular and where it is singular. These results show that tensor methods are significantly more efficient and more reliable than standard methods based on Newton's method.
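For context, the tensor model augments the usual quadratic model with low-order tensor terms. A schematic form of the model about the current iterate x_c, written here following the Schnabel-Chow formulation (the precise choice of the tensor terms follows the interpolation conditions in their paper), is

    f_T(x_c + d) = f(x_c) + \nabla f(x_c)^T d
                   + \tfrac{1}{2}\, d^T \nabla^2 f(x_c)\, d
                   + \tfrac{1}{6}\, T_c d^3
                   + \tfrac{1}{24}\, V_c d^4,

where T_c d^3 and V_c d^4 denote the third- and fourth-order tensor terms applied to the step d, chosen so that the model interpolates function and gradient information from one or more previous iterates.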


Related articles

A Software Package for Large Sparse Unconstrained Optimization using Tensor Methods

We describe a new package for minimizing an unconstrained nonlinear function where the Hessian is large and sparse. The software allows the user to select between a tensor method and a standard method based upon a quadratic model. The tensor method models the objective function by a fourth-order model, where the third- and fourth-order terms are chosen such that the extra cost of forming and so...
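As a rough illustration of the modeling idea only (not the package's actual construction, which interpolates richer information and exploits sparsity), the sketch below builds a simplified fourth-order model whose extra terms are scalar multiples of (s^T d)^3 and (s^T d)^4 for a single past step s, chosen so the model reproduces the function value and the directional derivative at one past iterate. The helper name is hypothetical.

    import numpy as np

    def simplified_tensor_model(f_c, g_c, H_c, x_c, x_p, f_p, g_p):
        # Toy fourth-order model about x_c (illustration only): the quadratic
        # model plus scalar multiples of (s^T d)^3 and (s^T d)^4,
        # where s = x_p - x_c is one past step.
        s = x_p - x_c
        q = s @ s
        # Residuals of the quadratic model at the past iterate x_p.
        r1 = f_p - (f_c + g_c @ s + 0.5 * s @ H_c @ s)   # function value
        r2 = g_p @ s - (g_c @ s + s @ H_c @ s)           # directional derivative
        # Choose the scalars a, b so the model matches both residuals at d = s.
        A = np.array([[q**3 / 6.0, q**4 / 24.0],
                      [q**3 / 2.0, q**4 / 6.0]])
        a, b = np.linalg.solve(A, np.array([r1, r2]))

        def model(d):
            t = s @ d
            return (f_c + g_c @ d + 0.5 * d @ H_c @ d
                    + a * t**3 / 6.0 + b * t**4 / 24.0)

        return model

With only two free scalars, fitting the extra terms amounts to the 2x2 linear solve above, which is why the cost beyond the quadratic model stays small in this simplified setting.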

A limited memory adaptive trust-region approach for large-scale unconstrained optimization

This study concerns a trust-region-based method for solving unconstrained optimization problems. The approach takes advantage of the compact limited-memory BFGS updating formula together with an appropriate adaptive radius strategy. In our approach, the adaptive technique reduces the number of subproblems that must be solved, while exploiting the structure of limited-memory quasi-Newt...
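The trust-region machinery behind such methods follows a standard pattern: a trial step is accepted or rejected according to the ratio of actual to predicted reduction, and the radius is adjusted accordingly. The sketch below shows a generic ratio-based radius update with conventional default thresholds; the paper's specific adaptive radius rule and its compact limited-memory BFGS subproblem solver are not reproduced here.

    def update_trust_region(rho, step_norm, radius,
                            eta1=0.25, eta2=0.75, shrink=0.25, grow=2.0):
        # Generic ratio-based trust-region radius update; the thresholds are
        # conventional defaults, not the adaptive rule from the paper.
        # rho = (actual reduction) / (predicted reduction) for the trial step.
        if rho < eta1:
            radius = shrink * step_norm          # poor model: shrink the region
        elif rho > eta2 and step_norm >= 0.99 * radius:
            radius = grow * radius               # good model, step near boundary: expand
        accepted = rho > eta1                    # accept only if the model was trustworthy
        return radius, accepted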

Optimization of unconstrained functions with sparse Hessian matrices - Quasi-Newton methods

Newton-type methods and quasi-Newton methods have proven to be very successful in solving dense unconstrained optimization problems. Recently there has been considerable interest in extending these methods to large problems in which the Hessian matrix has an a priori known sparsity pattern. This paper treats sparse quasi-Newton methods in a uniform fashion and shows the effect of loss of pos...
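The loss of positive definiteness alluded to above can be seen in a toy experiment: naively zeroing the off-pattern entries of a dense BFGS update (only a caricature of genuine sparse quasi-Newton updates) can leave the Hessian approximation indefinite, as the sketch below illustrates.

    import numpy as np

    def bfgs_update(B, s, y):
        # Standard dense BFGS update of the Hessian approximation B.
        Bs = B @ s
        return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

    def impose_pattern(B, pattern):
        # Zero out entries outside a prescribed sparsity pattern
        # (a caricature of a sparse quasi-Newton update, for illustration).
        return np.where(pattern, B, 0.0)

    rng = np.random.default_rng(0)
    n = 6
    B = np.eye(n)
    # Tridiagonal sparsity pattern assumed known a priori.
    pattern = (np.eye(n, dtype=bool) | np.eye(n, k=1, dtype=bool)
               | np.eye(n, k=-1, dtype=bool))
    for _ in range(5):
        s = rng.standard_normal(n)
        y = s + 0.1 * rng.standard_normal(n)
        if y @ s > 0 and s @ (B @ s) > 0:        # keep the updates well defined
            B = impose_pattern(bfgs_update(B, s, y), pattern)
    # The truncated approximation need not stay positive definite.
    print("smallest eigenvalue:", np.linalg.eigvalsh(B).min())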

A New Hybrid Conjugate Gradient Method Based on Eigenvalue Analysis for Unconstrained Optimization Problems

In this paper, two extended three-term conjugate gradient methods based on the Liu-Storey (LS) conjugate gradient method are presented to solve unconstrained optimization problems. A remarkable property of the proposed methods is that, based on eigenvalue analysis, the search direction always satisfies the sufficient descent condition independently of the line search method. The globa...
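For reference, the classical Liu-Storey direction underlying these methods is d_k = -g_k + beta_k d_{k-1} with beta_k^{LS} = g_k^T (g_k - g_{k-1}) / (-d_{k-1}^T g_{k-1}); the extended three-term directions of the paper add further correction terms that are not reproduced here. A minimal sketch of the classical direction with a numerical sufficient-descent safeguard:

    import numpy as np

    def liu_storey_direction(g, g_prev, d_prev, c=1e-4):
        # Classical Liu-Storey (LS) conjugate gradient direction with a
        # numerical sufficient-descent safeguard; restarts with steepest
        # descent if d^T g <= -c * ||g||^2 fails to hold.
        beta = g @ (g - g_prev) / (-(d_prev @ g_prev))
        d = -g + beta * d_prev
        if d @ g > -c * (g @ g):
            d = -g
        return d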

Optimization of unconstrained functions with sparse Hessian matrices - Newton-type methods

Newton-type methods for unconstrained optimization problems have been very successful when coupled with a modified Cholesky factorization to take into account the possible lack of positive definiteness in the Hessian matrix. In this paper we discuss the application of these methods to large problems that have a sparse Hessian matrix whose sparsity pattern is known a priori. Quite often it is difficult, ...
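A common, simple realization of this idea is a diagonal-shift variant rather than the full modified Cholesky factorization the paper discusses: add a multiple of the identity to the Hessian until a Cholesky factorization succeeds, then solve for the Newton-like step. For genuinely large sparse Hessians a sparse Cholesky factorization would replace the dense one used in this sketch.

    import numpy as np

    def modified_newton_step(H, g, tau0=1e-3, max_tries=60):
        # Solve (H + tau*I) d = -g, increasing tau >= 0 until the shifted
        # Hessian admits a Cholesky factorization. A simple stand-in for a
        # modified Cholesky factorization.
        n = H.shape[0]
        tau = 0.0 if np.all(np.diag(H) > 0) else tau0
        for _ in range(max_tries):
            try:
                L = np.linalg.cholesky(H + tau * np.eye(n))
                y = np.linalg.solve(L, -g)        # solve L y = -g
                return np.linalg.solve(L.T, y)    # solve L^T d = y
            except np.linalg.LinAlgError:
                tau = max(2.0 * tau, tau0)
        raise RuntimeError("shifted Hessian never became positive definite")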


Journal:
  • SIAM Journal on Optimization

Volume 7, Issue -

Pages -

Publication year: 1997